2025-07-28 09:44:18,797 [ 147138 ] INFO : ClickHouse root is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse (runner:53, check_args_and_update_paths)
2025-07-28 09:44:18,798 [ 147138 ] INFO : Cases dir is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration (runner:79, check_args_and_update_paths)
2025-07-28 09:44:18,798 [ 147138 ] INFO : utils dir is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse/utils (runner:90, check_args_and_update_paths)
2025-07-28 09:44:18,798 [ 147138 ] INFO : base_configs_dir: /home/ubuntu/_work/ClickHouse/ClickHouse/programs/server, binary: /home/ubuntu/_work/_temp/test/build/clickhouse, cases_dir: /home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration (runner:92, check_args_and_update_paths)
clickhouse_integration_tests_volume
Running pytest container as: 'docker run --rm --name clickhouse_integration_tests_42qml6 --privileged --dns-search='.' --memory=30709026816 --security-opt seccomp=unconfined --cap-add=SYS_PTRACE --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/backupview:/ClickHouse/utils/backupview --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/grpc-client/pb2:/ClickHouse/utils/grpc-client/pb2 --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_DOTNET_CLIENT_TAG=11de0b29a15d -e DOCKER_HELPER_TAG=5dc43a6382f0 -e DOCKER_BASE_TAG=5ccda723c1fc -e DOCKER_KERBEROS_KDC_TAG=9391ecdee8d7 -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=9bec2a638e6e -e DOCKER_MYSQL_JAVA_CLIENT_TAG=766bff31cfe4 -e DOCKER_MYSQL_JS_CLIENT_TAG=41ba7c2ec2a1 -e DOCKER_MYSQL_PHP_CLIENT_TAG=88be89c1e3b6 -e DOCKER_NGINX_DAV_TAG=b55ac9cd7519 -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=a4eff5c7f4d6 -e DOCKER_PYTHON_BOTTLE_TAG=d862517635bf -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e CLICKHOUSE_USE_OLD_ANALYZER=1 -e PYTHONUNBUFFERED=1 -e PYTEST_ADDOPTS="--dist=loadfile -n 10 -rfEps --run-id=1 --color=no --durations=0 --report-log=parallel0_1.jsonl --report-log-exclude-logs-on-passed-tests test_database_delta/test.py::test_complex_table_schema test_database_delta/test.py::test_embedded_database_and_tables test_database_delta/test.py::test_multiple_schemes_tables -vvv " altinityinfra/integration-tests-runner:226bfaf75ac1 '.
Start tests
============================= test session starts ==============================
platform linux -- Python 3.10.12, pytest-7.4.4, pluggy-1.5.0 -- /usr/bin/python3
cachedir: .pytest_cache
Test order randomisation NOT enabled. Enable with --random-order or --random-order-bucket=
rootdir: /ClickHouse/tests/integration
configfile: pytest.ini
plugins: timeout-2.3.1, repeat-0.9.3, order-1.0.0, reportlog-0.4.0, xdist-3.5.0, random-order-1.1.1
timeout: 900.0s
timeout method: signal
timeout func_only: False
created: 10/10 workers
10 workers [3 items]

scheduling tests via LoadFileScheduling

test_database_delta/test.py::test_complex_table_schema
[gw1] [ 33%] FAILED test_database_delta/test.py::test_complex_table_schema
test_database_delta/test.py::test_embedded_database_and_tables
[gw1] [ 66%] FAILED test_database_delta/test.py::test_embedded_database_and_tables
test_database_delta/test.py::test_multiple_schemes_tables
[gw1] [100%] FAILED test_database_delta/test.py::test_multiple_schemes_tables

=================================== FAILURES ===================================
__________________________ test_complex_table_schema ___________________________
[gw1] linux -- Python 3.10.12 /usr/bin/python3

started_cluster =

    def test_complex_table_schema(started_cluster):
        node1 = started_cluster.instances['node1']
        execute_spark_query(node1, "CREATE SCHEMA schema_with_complex_tables", ignore_exit_code=True)
        schema = "event_date DATE, event_time TIMESTAMP, hits ARRAY, ids MAP, really_complex STRUCT"
        create_query = f"CREATE TABLE schema_with_complex_tables.complex_table ({schema}) using Delta location '/tmp/complex_schema/complex_table'"
        execute_spark_query(node1, create_query, ignore_exit_code=True)
        execute_spark_query(node1, "insert into schema_with_complex_tables.complex_table SELECT to_date('2024-10-01', 'yyyy-MM-dd'), to_timestamp('2024-10-01 00:12:00'), array(42, 123, 77), map(7, 'v7', 5, 'v5'), named_struct(\\\"f1\\\", 34, \\\"f2\\\", 'hello')", ignore_exit_code=True)
        node1.query("create database complex_schema engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false", settings={"allow_experimental_database_unity_catalog": "1"})
        complex_schema_tables = list(sorted(node1.query("SHOW TABLES FROM complex_schema LIKE 'schema_with_complex_tables%'", settings={'use_hive_partitioning':'0'}).strip().split('\n')))
        assert len(complex_schema_tables) == 1
>       print(node1.query("SHOW CREATE TABLE complex_schema.`schema_with_complex_tables.complex_table`"))

test_database_delta/test.py:125:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
helpers/cluster.py:3649: in query
    return self.client.query(
helpers/client.py:39: in wrap
    return func(self, *args, **kwargs)
helpers/client.py:79: in query
    ).get_answer()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    def get_answer(self):
        self.process.wait(timeout=DEFAULT_QUERY_TIMEOUT)
        self.stdout_file.seek(0)
        self.stderr_file.seek(0)
        stdout = self.stdout_file.read().decode("utf-8", errors="replace")
        stderr = self.stderr_file.read().decode("utf-8", errors="replace")
        if (
            self.timer is not None
            and not self.process_finished_before_timeout
            and not self.ignore_error
        ):
            logging.debug(f"Timed out. Last stdout:{stdout}, stderr:{stderr}")
            raise QueryTimeoutExceedException("Client timed out!")
        if (
            self.process.returncode != 0 or self.remove_trash_from_stderr(stderr)
        ) and not self.ignore_error:
>           raise QueryRuntimeException(
                "Client failed! Return code: {}, stderr: {}".format(
                    self.process.returncode, stderr
                ),
                self.process.returncode,
                stderr,
            )
E helpers.client.QueryRuntimeException: Client failed! Return code: 86, stderr: Received exception from server (version 25.3.6):
E Code: 86. DB::Exception: Received from 172.16.1.2:9000. DB::HTTPException. DB::HTTPException: Received error from remote server http://localhost:8080/api/2.1/unity-catalog/tables/unity.schema_with_complex_tables.complex_table. HTTP status code: 404 'Not Found', body length: 182 bytes, body: '{"error_code":"NOT_FOUND","details":[{"reason":"NOT_FOUND","metadata":{},"@type":"google.rpc.ErrorInfo"}],"stack_trace":null,"message":"Schema not found: schema_with_complex_tables"}': while parsing JSON: . Stack trace:
E
E 0. ./contrib/llvm-project/libcxx/include/__exception/exception.h:113: Poco::Exception::Exception(String const&, int) @ 0x00000000382e5051
E 1. ./build_docker/./src/Common/Exception.cpp:108: DB::Exception::Exception(DB::Exception::MessageMasked&&, int, bool) @ 0x000000001bd54ed1
E 2. DB::Exception::Exception(PreformattedMessage&&, int) @ 0x000000000c38e20b
E 3. ./src/Common/Exception.h:130: DB::HTTPException::makeExceptionMessage(int, String const&, Poco::Net::HTTPResponse::HTTPStatus, String const&, String const&) @ 0x000000001c500c1e
E 4. ./src/IO/HTTPCommon.h:33: DB::HTTPException::HTTPException(int, String const&, Poco::Net::HTTPResponse::HTTPStatus, String const&, String const&) @ 0x000000001c500fa9
E 5. ./build_docker/./src/IO/HTTPCommon.cpp:93: DB::assertResponseIsOk(String const&, Poco::Net::HTTPResponse&, std::basic_istream>&, bool) @ 0x000000001c5008eb
E 6. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:277: DB::ReadWriteBufferFromHTTP::callImpl(Poco::Net::HTTPResponse&, String const&, std::optional const&, bool) const @ 0x0000000021207493
E 7. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:285: DB::ReadWriteBufferFromHTTP::callWithRedirects(Poco::Net::HTTPResponse&, String const&, std::optional const&) @ 0x00000000212079dc
E 8. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:408: DB::ReadWriteBufferFromHTTP::initialize() @ 0x0000000021208a5b
E 9. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:472: void std::__function::__policy_invoker::__call_impl[abi:ne190107]>(std::__function::__policy_storage const*) @ 0x000000002120e378
E 10. ./contrib/llvm-project/libcxx/include/__functional/function.h:716: ? @ 0x00000000212033d1
E 11. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:465: DB::ReadWriteBufferFromHTTP::nextImpl() @ 0x000000002120b083
E 12. DB::ReadBuffer::next() @ 0x000000000c5cc20b
E 13. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:254: DB::ReadWriteBufferFromHTTP::ReadWriteBufferFromHTTP(DB::HTTPConnectionGroupType const&, Poco::URI const&, String const&, DB::ProxyConfiguration, DB::ReadSettings, DB::ConnectionTimeouts, Poco::Net::HTTPBasicCredentials const&, DB::RemoteHostFilter const*, unsigned long, unsigned long, std::function>&)>, bool, bool, std::vector>, bool, std::optional) @ 0x0000000021206744
E 14. ./contrib/llvm-project/libcxx/include/__memory/unique_ptr.h:634: std::__unique_if::__unique_single std::make_unique[abi:ne190107]>&)>&, bool&, bool&, std::vector>&, bool&, std::nullopt_t const&>(DB::HTTPConnectionGroupType&, Poco::URI&, String&, DB::ProxyConfiguration&, DB::ReadSettings&, DB::ConnectionTimeouts&, Poco::Net::HTTPBasicCredentials const&, DB::RemoteHostFilter const*&, unsigned long&, unsigned long&, std::function>&)>&, bool&, bool&, std::vector>&, bool&, std::nullopt_t const&) @ 0x00000000211f1e28
E 15. ./src/IO/ReadWriteBufferFromHTTP.h:248: DataLake::createReadBuffer(String const&, std::shared_ptr, Poco::Net::HTTPBasicCredentials const&, std::vector, std::allocator>> const&, std::vector> const&, String const&, std::function>&)>) @ 0x000000002845c98a
E 16. ./build_docker/./src/Databases/DataLake/HTTPBasedCatalogUtils.cpp:50: DataLake::makeHTTPRequestAndReadJSON(String const&, std::shared_ptr, Poco::Net::HTTPBasicCredentials const&, std::vector, std::allocator>> const&, std::vector> const&, String const&, std::function>&)>) @ 0x000000002845d135
E 17. ./build_docker/./src/Databases/DataLake/UnityCatalog.cpp:55: DataLake::UnityCatalog::getJSONRequest(String const&, std::vector, std::allocator>> const&) const @ 0x000000002844a1ae
E 18. ./build_docker/./src/Databases/DataLake/UnityCatalog.cpp:146: DataLake::UnityCatalog::tryGetTableMetadata(String const&, String const&, DataLake::TableMetadata&) const @ 0x0000000028450cba
E 19. ./build_docker/./src/Databases/DataLake/UnityCatalog.cpp:96: DataLake::UnityCatalog::getTableMetadata(String const&, String const&, DataLake::TableMetadata&) const @ 0x00000000284502b0
E 20. ./build_docker/./src/Databases/DataLake/DatabaseDataLake.cpp:484: DB::DatabaseDataLake::getCreateTableQueryImpl(String const&, std::shared_ptr, bool) const @ 0x0000000028406413
E 21. ./src/Databases/IDatabase.h:352: DB::InterpreterShowCreateQuery::executeImpl() @ 0x000000002ac8b34f
E 22. ./build_docker/./src/Interpreters/InterpreterShowCreateQuery.cpp:34: DB::InterpreterShowCreateQuery::execute() @ 0x000000002ac8a9df
E 23. ./build_docker/./src/Interpreters/executeQuery.cpp:1457: DB::executeQueryImpl(char const*, char const*, std::shared_ptr, DB::QueryFlags, DB::QueryProcessingStage::Enum, DB::ReadBuffer*, std::shared_ptr&) @ 0x000000002abdbfa3
E 24. ./build_docker/./src/Interpreters/executeQuery.cpp:1624: DB::executeQuery(String const&, std::shared_ptr, DB::QueryFlags, DB::QueryProcessingStage::Enum) @ 0x000000002abd3ee5
E 25. ./build_docker/./src/Server/TCPHandler.cpp:664: DB::TCPHandler::runImpl() @ 0x000000002fd40c16
E 26. ./build_docker/./src/Server/TCPHandler.cpp:2629: DB::TCPHandler::run() @ 0x000000002fd808da
E 27. ./build_docker/./base/poco/Net/src/TCPServerConnection.cpp:40: Poco::Net::TCPServerConnection::start() @ 0x00000000384c034f
E 28. ./build_docker/./base/poco/Net/src/TCPServerDispatcher.cpp:115: Poco::Net::TCPServerDispatcher::run() @ 0x00000000384c1017
E 29. ./build_docker/./base/poco/Foundation/src/ThreadPool.cpp:205: Poco::PooledThread::run() @ 0x00000000383cf96b
E 30. ./base/poco/Foundation/src/Thread_POSIX.cpp:335: Poco::ThreadImpl::runnableEntry(void*) @ 0x00000000383c9568
E 31. asan_thread_start(void*) @ 0x000000000c340e77
E . (RECEIVED_ERROR_FROM_REMOTE_IO_SERVER)
E
E (query: SHOW CREATE TABLE complex_schema.`schema_with_complex_tables.complex_table`)

helpers/client.py:248: QueryRuntimeException
---------------------------- Captured stdout setup -----------------------------
Copy common default production configuration from /clickhouse-config.
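A minimal sketch of the flow that test_complex_table_schema drives, reconstructed from the captured source above; the helper names (execute_spark_query, started_cluster.instances, node1.query) are taken from the traceback and their exact signatures are assumed, not verified:

    # Sketch of the failing flow, assuming the integration-test helpers shown in the traceback.
    def reproduce_complex_table_schema(started_cluster):
        node1 = started_cluster.instances["node1"]

        # Spark side: create the schema (and then a Delta table) through Unity Catalog.
        # ignore_exit_code=True means a failed spark-sql run (see the captured call log
        # below) is swallowed and the schema is simply never created.
        execute_spark_query(node1, "CREATE SCHEMA schema_with_complex_tables", ignore_exit_code=True)

        # ClickHouse side: attach the Unity Catalog REST endpoint as a database.
        node1.query(
            "create database complex_schema "
            "engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') "
            "settings warehouse = 'unity', catalog_type='unity', vended_credentials=false",
            settings={"allow_experimental_database_unity_catalog": "1"},
        )

        # The statement that fails with code 86: the catalog answers HTTP 404
        # ("Schema not found: schema_with_complex_tables") because the Spark DDL above never ran.
        node1.query("SHOW CREATE TABLE complex_schema.`schema_with_complex_tables.complex_table`")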
Files: config.xml, users.xml ------------------------------ Captured log setup ------------------------------ 2025-07-28 09:44:26.306000 [ 653 ] DEBUG : Command:[docker ps | wc -l] (cluster.py:121, run_and_check) 2025-07-28 09:44:26.338000 [ 653 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check) 2025-07-28 09:44:26.338000 [ 653 ] DEBUG : No running containers (conftest.py:95, cleanup_environment) 2025-07-28 09:44:26.338000 [ 653 ] DEBUG : Pruning Docker networks (conftest.py:97, cleanup_environment) 2025-07-28 09:44:26.338000 [ 653 ] DEBUG : Command:[docker network prune --force] (cluster.py:121, run_and_check) 2025-07-28 09:44:26.368000 [ 653 ] DEBUG : Command:[sysctl net.ipv4.ip_local_port_range='55000 65535'] (cluster.py:121, run_and_check) 2025-07-28 09:44:26.373000 [ 653 ] DEBUG : Stdout:net.ipv4.ip_local_port_range = 55000 65535 (cluster.py:145, run_and_check) 2025-07-28 09:44:26.373000 [ 653 ] DEBUG : ENV DOCKER_KERBEROS_KDC_TAG 9391ecdee8d7 (cluster.py:424, __init__) 2025-07-28 09:44:26.373000 [ 653 ] DEBUG : ENV CLICKHOUSE_TESTS_SERVER_BIN_PATH /clickhouse (cluster.py:424, __init__) 2025-07-28 09:44:26.374000 [ 653 ] DEBUG : ENV MSAN_OPTIONS abort_on_error=1 poison_in_dtor=1 (cluster.py:424, __init__) 2025-07-28 09:44:26.374000 [ 653 ] DEBUG : ENV JAVA_TOOL_OPTIONS -Djdk.attach.allowAttachSelf=true (cluster.py:424, __init__) 2025-07-28 09:44:26.374000 [ 653 ] DEBUG : ENV TSAN_OPTIONS halt_on_error=1 abort_on_error=1 history_size=7 memory_limit_mb=46080 second_deadlock_stack=1 (cluster.py:424, __init__) 2025-07-28 09:44:26.374000 [ 653 ] DEBUG : ENV HOSTNAME 94d9bd3b1e6b (cluster.py:424, __init__) 2025-07-28 09:44:26.374000 [ 653 ] DEBUG : ENV SHLVL 0 (cluster.py:424, __init__) 2025-07-28 09:44:26.374000 [ 653 ] DEBUG : ENV HOME /root (cluster.py:424, __init__) 2025-07-28 09:44:26.374000 [ 653 ] DEBUG : ENV OLDPWD / (cluster.py:424, __init__) 2025-07-28 09:44:26.374000 [ 653 ] DEBUG : ENV DOCKER_HELPER_TAG 5dc43a6382f0 (cluster.py:424, __init__) 2025-07-28 09:44:26.374000 [ 653 ] DEBUG : ENV PYTHONUNBUFFERED 1 (cluster.py:424, __init__) 2025-07-28 09:44:26.375000 [ 653 ] DEBUG : ENV DOCKER_PYTHON_BOTTLE_TAG d862517635bf (cluster.py:424, __init__) 2025-07-28 09:44:26.375000 [ 653 ] DEBUG : ENV UBSAN_OPTIONS print_stacktrace=1 (cluster.py:424, __init__) 2025-07-28 09:44:26.375000 [ 653 ] DEBUG : ENV PYTEST_ADDOPTS --dist=loadfile -n 10 -rfEps --run-id=1 --color=no --durations=0 --report-log=parallel0_1.jsonl --report-log-exclude-logs-on-passed-tests test_database_delta/test.py::test_complex_table_schema test_database_delta/test.py::test_embedded_database_and_tables test_database_delta/test.py::test_multiple_schemes_tables -vvv (cluster.py:424, __init__) 2025-07-28 09:44:26.375000 [ 653 ] DEBUG : ENV COMPOSE_HTTP_TIMEOUT 600 (cluster.py:424, __init__) 2025-07-28 09:44:26.375000 [ 653 ] DEBUG : ENV DOCKER_MYSQL_PHP_CLIENT_TAG 88be89c1e3b6 (cluster.py:424, __init__) 2025-07-28 09:44:26.375000 [ 653 ] DEBUG : ENV DOCKER_DOTNET_CLIENT_TAG 11de0b29a15d (cluster.py:424, __init__) 2025-07-28 09:44:26.375000 [ 653 ] DEBUG : ENV CLICKHOUSE_TESTS_CLIENT_BIN_PATH /clickhouse (cluster.py:424, __init__) 2025-07-28 09:44:26.375000 [ 653 ] DEBUG : ENV DOCKER_MYSQL_JS_CLIENT_TAG 41ba7c2ec2a1 (cluster.py:424, __init__) 2025-07-28 09:44:26.376000 [ 653 ] DEBUG : ENV CLICKHOUSE_USE_OLD_ANALYZER 1 (cluster.py:424, __init__) 2025-07-28 09:44:26.376000 [ 653 ] DEBUG : ENV PATH /spark-3.3.2-bin-hadoop3/bin:/opt/gdb/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin (cluster.py:424, __init__) 
2025-07-28 09:44:26.376000 [ 653 ] DEBUG : ENV DOCKER_KERBERIZED_HADOOP_TAG latest (cluster.py:424, __init__) 2025-07-28 09:44:26.376000 [ 653 ] DEBUG : ENV DOCKER_CHANNEL stable (cluster.py:424, __init__) 2025-07-28 09:44:26.376000 [ 653 ] DEBUG : ENV DOCKER_CLIENT_TIMEOUT 300 (cluster.py:424, __init__) 2025-07-28 09:44:26.376000 [ 653 ] DEBUG : ENV DOCKER_POSTGRESQL_JAVA_CLIENT_TAG a4eff5c7f4d6 (cluster.py:424, __init__) 2025-07-28 09:44:26.376000 [ 653 ] DEBUG : ENV DOCKER_NGINX_DAV_TAG b55ac9cd7519 (cluster.py:424, __init__) 2025-07-28 09:44:26.376000 [ 653 ] DEBUG : ENV DOCKER_MYSQL_GOLANG_CLIENT_TAG 9bec2a638e6e (cluster.py:424, __init__) 2025-07-28 09:44:26.377000 [ 653 ] DEBUG : ENV PWD /ClickHouse/tests/integration (cluster.py:424, __init__) 2025-07-28 09:44:26.377000 [ 653 ] DEBUG : ENV DOCKER_MYSQL_JAVA_CLIENT_TAG 766bff31cfe4 (cluster.py:424, __init__) 2025-07-28 09:44:26.377000 [ 653 ] DEBUG : ENV CLICKHOUSE_TESTS_BASE_CONFIG_DIR /clickhouse-config (cluster.py:424, __init__) 2025-07-28 09:44:26.377000 [ 653 ] DEBUG : ENV TZ Etc/UTC (cluster.py:424, __init__) 2025-07-28 09:44:26.377000 [ 653 ] DEBUG : ENV JAVA_PATH /usr/lib/jvm/java-11-openjdk-amd64/bin/java (cluster.py:424, __init__) 2025-07-28 09:44:26.377000 [ 653 ] DEBUG : ENV DOCKER_BASE_TAG 5ccda723c1fc (cluster.py:424, __init__) 2025-07-28 09:44:26.377000 [ 653 ] DEBUG : ENV SPARK_HOME /spark-3.3.2-bin-hadoop3 (cluster.py:424, __init__) 2025-07-28 09:44:26.377000 [ 653 ] DEBUG : ENV LC_CTYPE C.UTF-8 (cluster.py:424, __init__) 2025-07-28 09:44:26.377000 [ 653 ] DEBUG : ENV INTEGRATION_TESTS_RUN_ID 1 (cluster.py:424, __init__) 2025-07-28 09:44:26.378000 [ 653 ] DEBUG : ENV WORKER_FREE_PORTS 30050 30051 30052 30053 30054 30055 30056 30057 30058 30059 30060 30061 30062 30063 30064 30065 30066 30067 30068 30069 30070 30071 30072 30073 30074 30075 30076 30077 30078 30079 30080 30081 30082 30083 30084 30085 30086 30087 30088 30089 30090 30091 30092 30093 30094 30095 30096 30097 30098 30099 (cluster.py:424, __init__) 2025-07-28 09:44:26.378000 [ 653 ] DEBUG : ENV PYTEST_XDIST_TESTRUNUID 02cf50870f2045979e741375c1462189 (cluster.py:424, __init__) 2025-07-28 09:44:26.378000 [ 653 ] DEBUG : ENV PYTEST_XDIST_WORKER gw1 (cluster.py:424, __init__) 2025-07-28 09:44:26.378000 [ 653 ] DEBUG : ENV PYTEST_XDIST_WORKER_COUNT 10 (cluster.py:424, __init__) 2025-07-28 09:44:26.378000 [ 653 ] DEBUG : ENV PYTEST_CURRENT_TEST test_database_delta/test.py::test_complex_table_schema (setup) (cluster.py:424, __init__) 2025-07-28 09:44:26.379000 [ 653 ] DEBUG : CLUSTER INIT base_config_dir:/clickhouse-config (cluster.py:724, __init__) 2025-07-28 09:44:26.379000 [ 653 ] DEBUG : clickhouse_start_command: clickhouse server --config-file=/etc/clickhouse-server/{main_config_file} --log-file=/var/log/clickhouse-server/clickhouse-server.log --errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log (cluster.py:1662, add_instance) 2025-07-28 09:44:26.380000 [ 653 ] DEBUG : Cluster name: project_name:roottestdatabasedelta-gw1. Added instance name:node1 tag:latest base_cmd:['docker', 'compose', '--env-file', '/ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/.env', '--project-name', 'roottestdatabasedelta-gw1', '--file', '/ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/node1/docker-compose.yml'] docker_compose_yml_dir:/ClickHouse/tests/integration/helpers/../../../tests/integration/compose/ (cluster.py:1948, add_instance) 2025-07-28 09:44:26.380000 [ 653 ] INFO : Starting cluster... 
(test.py:42, started_cluster) 2025-07-28 09:44:26.380000 [ 653 ] INFO : Running tests in /ClickHouse/tests/integration/test_database_delta/test.py (cluster.py:2738, start) 2025-07-28 09:44:26.380000 [ 653 ] DEBUG : Cluster start called. is_up=False (cluster.py:2745, start) 2025-07-28 09:44:26.410000 [ 653 ] DEBUG : Docker networks for project roottestdatabasedelta-gw1 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces) 2025-07-28 09:44:26.434000 [ 653 ] DEBUG : Docker containers for project roottestdatabasedelta-gw1 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces) 2025-07-28 09:44:26.462000 [ 653 ] DEBUG : Docker volumes for project roottestdatabasedelta-gw1 are DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces) 2025-07-28 09:44:26.462000 [ 653 ] DEBUG : Cleanup called (cluster.py:851, cleanup) 2025-07-28 09:44:26.490000 [ 653 ] DEBUG : Docker networks for project roottestdatabasedelta-gw1 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces) 2025-07-28 09:44:26.520000 [ 653 ] DEBUG : Docker containers for project roottestdatabasedelta-gw1 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces) 2025-07-28 09:44:26.550000 [ 653 ] DEBUG : Docker volumes for project roottestdatabasedelta-gw1 are DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces) 2025-07-28 09:44:26.550000 [ 653 ] DEBUG : Command:[docker container list --all --filter name='^/roottestdatabasedelta-gw1-.*-1$' --format '{{.ID}}:{{.Names}}'] (cluster.py:121, run_and_check) 2025-07-28 09:44:26.580000 [ 653 ] DEBUG : Unstopped containers: {} (cluster.py:865, cleanup) 2025-07-28 09:44:26.581000 [ 653 ] DEBUG : No running containers for project: roottestdatabasedelta-gw1 (cluster.py:879, cleanup) 2025-07-28 09:44:26.581000 [ 653 ] DEBUG : Trying to prune unused networks... (cluster.py:885, cleanup) 2025-07-28 09:44:26.608000 [ 653 ] DEBUG : Trying to prune unused images... (cluster.py:901, cleanup) 2025-07-28 09:44:26.608000 [ 653 ] DEBUG : Command:[docker image prune -f] (cluster.py:121, run_and_check) 2025-07-28 09:44:26.651000 [ 653 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:145, run_and_check) 2025-07-28 09:44:26.651000 [ 653 ] DEBUG : Images pruned (cluster.py:904, cleanup) 2025-07-28 09:44:26.652000 [ 653 ] DEBUG : Trying to prune unused volumes... 
(cluster.py:910, cleanup) 2025-07-28 09:44:26.652000 [ 653 ] DEBUG : Command:[docker volume ls | wc -l] (cluster.py:121, run_and_check) 2025-07-28 09:44:26.681000 [ 653 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check) 2025-07-28 09:44:26.682000 [ 653 ] DEBUG : Volumes pruned: 1 (cluster.py:915, cleanup) 2025-07-28 09:44:26.682000 [ 653 ] DEBUG : Setup directory for instance: node1 (cluster.py:2758, start) 2025-07-28 09:44:26.683000 [ 653 ] DEBUG : Create directory for configuration generated in this helper (cluster.py:4628, create_dir) 2025-07-28 09:44:26.684000 [ 653 ] DEBUG : Create directory for common tests configuration (cluster.py:4633, create_dir) 2025-07-28 09:44:26.684000 [ 653 ] DEBUG : Copy common configuration from helpers (cluster.py:4653, create_dir) 2025-07-28 09:44:26.685000 [ 653 ] DEBUG : Generate and write macros file (cluster.py:4705, create_dir) 2025-07-28 09:44:26.685000 [ 653 ] DEBUG : Copy custom test config files [] to /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/node1/configs/config.d (cluster.py:4741, create_dir) 2025-07-28 09:44:26.685000 [ 653 ] DEBUG : Setup database dir /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/node1/database (cluster.py:4758, create_dir) 2025-07-28 09:44:26.686000 [ 653 ] DEBUG : Setup logs dir /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/node1/logs (cluster.py:4769, create_dir) 2025-07-28 09:44:26.686000 [ 653 ] DEBUG : Entrypoint cmd: ["clickhouse", "server", "--config-file=/etc/clickhouse-server/config.xml", "--log-file=/var/log/clickhouse-server/clickhouse-server.log", "--errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log", "--"] (cluster.py:4850, create_dir) 2025-07-28 09:44:26.686000 [ 653 ] DEBUG : Env {'ASAN_OPTIONS': 'use_sigaltstack=0', 'TSAN_OPTIONS': 'use_sigaltstack=0', 'CLICKHOUSE_WATCHDOG_ENABLE': '0', 'CLICKHOUSE_NATS_TLS_SECURE': '0', 'LLVM_PROFILE_FILE': '/var/lib/clickhouse/server_%h_%p_%m.profraw'} stored in /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/.env (cluster.py:96, _create_env_file) 2025-07-28 09:44:26.687000 [ 653 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file) 2025-07-28 09:44:26.687000 [ 653 ] DEBUG : No config file found (config.py:28, find_config_file) 2025-07-28 09:44:26.687000 [ 653 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file) 2025-07-28 09:44:26.687000 [ 653 ] DEBUG : No config file found (config.py:28, find_config_file) 2025-07-28 09:44:26.700000 [ 653 ] DEBUG : http://localhost:None "GET /version HTTP/1.1" 200 826 (connectionpool.py:547, _make_request) 2025-07-28 09:44:26.701000 [ 653 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/.env --project-name roottestdatabasedelta-gw1 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/node1/docker-compose.yml pull] (cluster.py:121, run_and_check) 2025-07-28 09:44:37.162000 [ 653 ] DEBUG : Stderr: node1 Pulling (cluster.py:147, run_and_check) 2025-07-28 09:44:37.162000 [ 653 ] DEBUG : Stderr: node1 Pulled (cluster.py:147, run_and_check) 2025-07-28 09:44:37.162000 [ 653 ] DEBUG : ('Trying to create ClickHouse instance by command %s', 'docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/.env --project-name roottestdatabasedelta-gw1 --file 
/ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/node1/docker-compose.yml up -d --no-recreate') (cluster.py:3139, start) 2025-07-28 09:44:37.163000 [ 653 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/.env --project-name roottestdatabasedelta-gw1 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/node1/docker-compose.yml up -d --no-recreate] (cluster.py:121, run_and_check) 2025-07-28 09:44:37.831000 [ 653 ] DEBUG : Stderr: Network roottestdatabasedelta-gw1_default Creating (cluster.py:147, run_and_check) 2025-07-28 09:44:37.831000 [ 653 ] DEBUG : Stderr: Network roottestdatabasedelta-gw1_default Created (cluster.py:147, run_and_check) 2025-07-28 09:44:37.831000 [ 653 ] DEBUG : Stderr: Container roottestdatabasedelta-gw1-node1-1 Creating (cluster.py:147, run_and_check) 2025-07-28 09:44:37.832000 [ 653 ] DEBUG : Stderr: Container roottestdatabasedelta-gw1-node1-1 Created (cluster.py:147, run_and_check) 2025-07-28 09:44:37.832000 [ 653 ] DEBUG : Stderr: Container roottestdatabasedelta-gw1-node1-1 Starting (cluster.py:147, run_and_check) 2025-07-28 09:44:37.832000 [ 653 ] DEBUG : Stderr: Container roottestdatabasedelta-gw1-node1-1 Started (cluster.py:147, run_and_check) 2025-07-28 09:44:37.832000 [ 653 ] DEBUG : ClickHouse instance created (cluster.py:3147, start) 2025-07-28 09:44:37.832000 [ 653 ] DEBUG : get_instance_ip instance_name=node1 (cluster.py:2005, get_instance_ip) 2025-07-28 09:44:37.838000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestdatabasedelta-gw1-node1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:37.839000 [ 653 ] DEBUG : get_instance_ip instance_name=node1 (cluster.py:2015, get_instance_global_ipv6) 2025-07-28 09:44:37.842000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestdatabasedelta-gw1-node1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:37.842000 [ 653 ] DEBUG : Waiting for ClickHouse start in node1, ip: 172.16.1.2... 
(cluster.py:3155, start) 2025-07-28 09:44:37.845000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestdatabasedelta-gw1-node1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:37.849000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:37.953000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:38.058000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:38.162000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:38.267000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:38.373000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:38.478000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:38.582000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:38.687000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:38.792000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:38.897000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:39.001000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:39.105000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:39.210000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:39.314000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:39.418000 [ 653 ] DEBUG : http://localhost:None "GET 
/v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:39.522000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:39.627000 [ 653 ] DEBUG : http://localhost:None "GET /v1.46/containers/11d02813290cc3f08b9780022d7a5a4640956a69d906df5bbc6f2dc6262e9540/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-07-28 09:44:39.629000 [ 653 ] DEBUG : ClickHouse node1 started (cluster.py:3159, start) 2025-07-28 09:44:39.630000 [ 653 ] DEBUG : run container_id:roottestdatabasedelta-gw1-node1-1 detach:False nothrow:False cmd: ['bash', '-c', 'cd /unitycatalog && nohup bin/start-uc-server &'] (cluster.py:2051, exec_in_container) 2025-07-28 09:44:39.630000 [ 653 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw1-node1-1 bash -c cd /unitycatalog && nohup bin/start-uc-server &] (cluster.py:121, run_and_check) ------------------------------ Captured log call ------------------------------- 2025-07-28 09:44:41.702000 [ 653 ] DEBUG : run container_id:roottestdatabasedelta-gw1-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "CREATE SCHEMA schema_with_complex_tables" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-07-28 09:44:41.703000 [ 653 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw1-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "CREATE SCHEMA schema_with_complex_tables" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-07-28 09:44:48.004000 [ 653 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-07-28 09:44:48.004000 [ 653 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-07-28 09:44:48.004000 [ 653 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:44:48.005000 [ 653 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-07-28 
09:44:48.005000 [ 653 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:44:48.005000 [ 653 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-23a2a673-a5aa-4631-aa56-31c7579c534a;1.0 (cluster.py:147, run_and_check) 2025-07-28 09:44:48.005000 [ 653 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-07-28 09:44:48.005000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:48.005000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:48.005000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:48.005000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:48.006000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:48.006000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:48.006000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:48.006000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:48.006000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:48.006000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:48.006000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:48.006000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-07-28 09:44:48.007000 [ 653 ] DEBUG : Stderr::: resolution report :: resolve 4340ms :: artifacts dl 1ms (cluster.py:147, run_and_check) 2025-07-28 09:44:48.007000 [ 653 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.007000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:44:48.007000 [ 653 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-07-28 09:44:48.007000 [ 653 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-07-28 09:44:48.007000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:44:48.007000 [ 653 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-07-28 09:44:48.007000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:44:48.008000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.008000 [ 653 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.008000 [ 653 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-07-28 09:44:48.008000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.008000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.008000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.008000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.008000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.008000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.009000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.009000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.009000 [ 653 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-07-28 09:44:48.009000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.009000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:48.009000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.009000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.009000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.010000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.010000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.010000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.010000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.010000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:48.010000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.010000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:44:48.010000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.010000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.011000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.011000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.011000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.011000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:48.011000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.011000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.011000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.011000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.012000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.012000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.012000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.012000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:48.012000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.012000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.013000 [ 
653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.013000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.013000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.013000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.013000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.013000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.013000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.014000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.014000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.014000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.014000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.014000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.014000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.014000 [ 653 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-07-28 09:44:48.015000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.015000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:48.015000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.015000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.015000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.015000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.015000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.016000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.016000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.016000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:48.016000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.016000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:44:48.016000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.016000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.016000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.017000 [ 
653 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.017000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.017000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:48.017000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.017000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.017000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.017000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.017000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.018000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.018000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.018000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:48.018000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.018000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.018000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.018000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.018000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.019000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.019000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.019000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.019000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.019000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.019000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.020000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.020000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.020000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.020000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.020000 [ 653 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-07-28 09:44:48.020000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.020000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:48.020000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.021000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.021000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.021000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.021000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.021000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.021000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.021000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:48.021000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.022000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:44:48.022000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.022000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.022000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.022000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.022000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.022000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:48.022000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.023000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.023000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.023000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.023000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.023000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.023000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.023000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:48.023000 [ 653 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.024000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:48.024000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.024000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.024000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.024000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:48.024000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.024000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.024000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.025000 [ 653 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.025000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.025000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.025000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.025000 [ 653 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-07-28 09:44:48.025000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.025000 [ 653 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-07-28 09:44:48.025000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.026000 [ 653 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-07-28 09:44:48.026000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.026000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.026000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.026000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.026000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:48.026000 [ 653 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-07-28 09:44:48.026000 [ 653 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-07-28 09:44:48.027000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-07-28 09:44:48.027000 [ 653 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-07-28 09:44:48.027000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-07-28 09:44:48.027000 [ 653 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-07-28 09:44:48.027000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-07-28 09:44:48.027000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-07-28 09:44:48.027000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-07-28 09:44:48.028000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-07-28 09:44:48.028000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-07-28 09:44:48.028000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-07-28 09:44:48.028000 [ 653 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-07-28 09:44:48.029000 [ 653 ] DEBUG : run container_id:roottestdatabasedelta-gw1-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "CREATE TABLE schema_with_complex_tables.complex_table (event_date DATE, event_time TIMESTAMP, hits ARRAY, ids MAP, really_complex STRUCT) using Delta location \'/tmp/complex_schema/complex_table\'" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-07-28 09:44:48.029000 [ 653 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw1-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "CREATE TABLE schema_with_complex_tables.complex_table (event_date DATE, event_time TIMESTAMP, hits ARRAY, ids MAP, really_complex STRUCT) using Delta location '/tmp/complex_schema/complex_table'" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-07-28 09:44:53.997000 [ 653 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-07-28 09:44:53.997000 [ 653 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars 
(cluster.py:147, run_and_check) 2025-07-28 09:44:53.998000 [ 653 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:44:53.998000 [ 653 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:44:53.998000 [ 653 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:44:53.998000 [ 653 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-8ddbe812-d599-4d79-98d6-c3728cdf9ea4;1.0 (cluster.py:147, run_and_check) 2025-07-28 09:44:53.998000 [ 653 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-07-28 09:44:53.998000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:53.999000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:53.999000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:53.999000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:53.999000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:53.999000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:53.999000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:53.999000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:54.000000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:54.000000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:54.000000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:44:54.000000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-07-28 09:44:54.000000 [ 653 ] DEBUG : Stderr::: resolution report :: resolve 4275ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-07-28 09:44:54.000000 [ 653 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.000000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:44:54.000000 [ 653 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-07-28 09:44:54.001000 [ 653 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-07-28 09:44:54.001000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:44:54.001000 [ 653 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-07-28 09:44:54.001000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:44:54.001000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.001000 [ 653 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.001000 [ 653 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-07-28 09:44:54.001000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.001000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.002000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.002000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.002000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.002000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.002000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.002000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.002000 [ 653 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-07-28 09:44:54.002000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.002000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:54.003000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.003000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.003000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.003000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.003000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.003000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.003000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.004000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:54.004000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.004000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:44:54.004000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.004000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.004000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.004000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.004000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.004000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:54.005000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.005000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.005000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.005000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.005000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.005000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.005000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.005000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:54.005000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.006000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.006000 [ 
653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.006000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.006000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.006000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.006000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.006000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.006000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.006000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.007000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.007000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.007000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.007000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.007000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.007000 [ 653 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-07-28 09:44:54.007000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.008000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:54.008000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.008000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.008000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.008000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.008000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.008000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.008000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.009000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:54.009000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.009000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:44:54.009000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.009000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.009000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.009000 [ 
653 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.010000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.010000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:54.010000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.010000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.010000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.010000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.010000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.010000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.010000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.011000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:54.011000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.011000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.011000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.011000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.011000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.011000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.012000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.012000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.012000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.012000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.012000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.012000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.012000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.012000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.013000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.013000 [ 653 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-07-28 09:44:54.013000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.013000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:54.013000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.013000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.013000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.013000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.013000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.014000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.014000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.014000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:54.014000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.014000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:44:54.014000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.014000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.014000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.014000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.015000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.015000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:54.015000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.015000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.015000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.015000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.015000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.015000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.016000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.016000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:44:54.016000 [ 653 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.016000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:44:54.016000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.016000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.016000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.016000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:44:54.016000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.017000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.017000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.017000 [ 653 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.017000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.017000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.017000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.017000 [ 653 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-07-28 09:44:54.017000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.017000 [ 653 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-07-28 09:44:54.018000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.018000 [ 653 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-07-28 09:44:54.018000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.018000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.018000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.018000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.018000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:44:54.018000 [ 653 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-07-28 09:44:54.018000 [ 653 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-07-28 09:44:54.019000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-07-28 09:44:54.019000 [ 653 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-07-28 09:44:54.019000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-07-28 09:44:54.019000 [ 653 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-07-28 09:44:54.019000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-07-28 09:44:54.019000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-07-28 09:44:54.019000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-07-28 09:44:54.019000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-07-28 09:44:54.020000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-07-28 09:44:54.020000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-07-28 09:44:54.020000 [ 653 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-07-28 09:44:54.020000 [ 653 ] DEBUG : run container_id:roottestdatabasedelta-gw1-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "insert into schema_with_complex_tables.complex_table SELECT to_date(\'2024-10-01\', \'yyyy-MM-dd\'), to_timestamp(\'2024-10-01 00:12:00\'), array(42, 123, 77), map(7, \'v7\', 5, \'v5\'), named_struct(\\"f1\\", 34, \\"f2\\", \'hello\')" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-07-28 09:44:54.020000 [ 653 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw1-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "insert into schema_with_complex_tables.complex_table SELECT to_date('2024-10-01', 'yyyy-MM-dd'), to_timestamp('2024-10-01 00:12:00'), array(42, 123, 77), map(7, 'v7', 5, 'v5'), named_struct(\"f1\", 34, \"f2\", 'hello')" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-07-28 09:45:00.148000 [ 653 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-07-28 09:45:00.149000 [ 653 ] DEBUG : Stderr:The 
jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-07-28 09:45:00.149000 [ 653 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:00.149000 [ 653 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:00.149000 [ 653 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:00.149000 [ 653 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-be03c639-9a48-41b3-8ef9-896e9cf4b359;1.0 (cluster.py:147, run_and_check) 2025-07-28 09:45:00.150000 [ 653 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-07-28 09:45:00.150000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:00.150000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:00.150000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:00.150000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:00.150000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:00.151000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:00.151000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:00.151000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:00.151000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:00.151000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:00.151000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:00.151000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-07-28 09:45:00.152000 [ 653 ] DEBUG : Stderr::: resolution report :: resolve 4337ms :: artifacts dl 1ms (cluster.py:147, run_and_check) 2025-07-28 09:45:00.152000 [ 653 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.152000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:00.152000 [ 653 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-07-28 09:45:00.152000 [ 653 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-07-28 09:45:00.152000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:00.152000 [ 653 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-07-28 09:45:00.153000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:00.153000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.153000 [ 653 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.153000 [ 653 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-07-28 09:45:00.153000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.153000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.154000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.154000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.154000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.154000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.154000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.154000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.154000 [ 653 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-07-28 09:45:00.155000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.155000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:00.155000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.155000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.156000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.156000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.156000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.156000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.156000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.156000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:00.156000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.157000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:00.157000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.157000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.157000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.157000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.157000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.157000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:00.158000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.158000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.158000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.158000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.158000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.158000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.159000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.159000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:00.159000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.159000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.159000 [ 
653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.159000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.159000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.160000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.160000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.160000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.160000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.160000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.160000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.160000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.160000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.161000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.161000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.161000 [ 653 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-07-28 09:45:00.161000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.161000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:00.161000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.161000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.161000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.161000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.162000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.162000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.162000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.162000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:00.162000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.162000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:00.162000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.162000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.162000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.163000 [ 
653 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.163000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.163000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:00.163000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.163000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.163000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.163000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.163000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.164000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.164000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.164000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:00.164000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.164000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.164000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.164000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.164000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.164000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.165000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.165000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.165000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.165000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.165000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.165000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.165000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.165000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.165000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.166000 [ 653 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-07-28 09:45:00.166000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.166000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:00.166000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.166000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.166000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.166000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.166000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.167000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.167000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.167000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:00.167000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.167000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:00.167000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.167000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.167000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.167000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.168000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.168000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:00.168000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.168000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.168000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.168000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.168000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.168000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.168000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.169000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:00.169000 [ 653 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.169000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:00.169000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.169000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.169000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.169000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:00.169000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.170000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.170000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.170000 [ 653 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.170000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.170000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.170000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.170000 [ 653 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-07-28 09:45:00.170000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.170000 [ 653 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-07-28 09:45:00.170000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.171000 [ 653 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-07-28 09:45:00.171000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.171000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.171000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.171000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.171000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:00.171000 [ 653 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-07-28 09:45:00.171000 [ 653 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-07-28 09:45:00.172000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-07-28 09:45:00.172000 [ 653 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-07-28 09:45:00.172000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-07-28 09:45:00.172000 [ 653 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-07-28 09:45:00.172000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-07-28 09:45:00.172000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-07-28 09:45:00.172000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-07-28 09:45:00.172000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-07-28 09:45:00.172000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-07-28 09:45:00.173000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-07-28 09:45:00.173000 [ 653 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-07-28 09:45:00.173000 [ 653 ] DEBUG : Executing query create database complex_schema engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false on node1 (cluster.py:3648, query) 2025-07-28 09:45:00.490000 [ 653 ] DEBUG : Executing query SHOW TABLES FROM complex_schema LIKE 'schema_with_complex_tables%' on node1 (cluster.py:3648, query) 2025-07-28 09:45:01.660000 [ 653 ] DEBUG : Executing query SHOW CREATE TABLE complex_schema.`schema_with_complex_tables.complex_table` on node1 (cluster.py:3648, query) ______________________ test_embedded_database_and_tables _______________________ [gw1] linux -- Python 3.10.12 /usr/bin/python3 started_cluster = def test_embedded_database_and_tables(started_cluster): node1 = started_cluster.instances['node1'] node1.query("create database unity_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false", settings={"allow_experimental_database_unity_catalog": "1"}) default_tables = list(sorted(node1.query("SHOW TABLES FROM unity_test LIKE 'default%'", settings={'use_hive_partitioning':'0'}).strip().split('\n'))) print("Default tables", default_tables) assert default_tables == ['default.marksheet', 'default.marksheet_uniform', 'default.numbers', 'default.user_countries'] for table in default_tables: if table == "default.marksheet_uniform": continue assert "DeltaLake" in node1.query(f"show create table unity_test.`{table}`") if table in ('default.marksheet', 'default.user_countries'): data_clickhouse = TSV(node1.query(f"SELECT * FROM unity_test.`{table}` ORDER BY 1,2,3")) > data_spark = TSV(execute_spark_query(node1, f"SELECT * FROM unity.{table} ORDER BY 1,2,3")) test_database_delta/test.py:90: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test_database_delta/test.py:54: in execute_spark_query return node.exec_in_container( helpers/cluster.py:4117: in exec_in_container return self.cluster.exec_in_container( helpers/cluster.py:2069: in exec_in_container result = subprocess_check_call( helpers/cluster.py:239: in subprocess_check_call return run_and_check(args, detach=detach, nothrow=nothrow, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker', 
'exec', 'roottestdatabasedelta-gw1-node1-1', 'bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql ...tCatalog=unity" \\\n -S -e "SELECT * FROM unity.default.marksheet ORDER BY 1,2,3" | grep -v \'loading settings\'\n'] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args: Union[Sequence[str], str], env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ) -> str: if shell: if isinstance(args, str): shell_args = args else: shell_args = next(a for a in args) else: shell_args = " ".join(args) logging.debug("Command:[%s]", shell_args) if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return "" res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout, check=False, ) out = res.stdout.decode("utf-8", "ignore") err = res.stderr.decode("utf-8", "ignore") # check_call(...) from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug("Stdout:%s", outline) for errline in err.splitlines(): logging.debug("Stderr:%s", errline) if res.returncode != 0: logging.debug("Exitcode:%s", res.returncode) if env: logging.debug("Env:%s", env) if not nothrow: > raise Exception( f"Command [{shell_args}] return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command [docker exec roottestdatabasedelta-gw1-node1-1 bash -c E cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ E --master "local[*]" \ E --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ E --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ E --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ E --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ E --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ E --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ E --conf "spark.sql.catalog.unity.token=" \ E --conf "spark.sql.defaultCatalog=unity" \ E -S -e "SELECT * FROM unity.default.marksheet ORDER BY 1,2,3" | grep -v 'loading settings' E ] return non-zero code 1: Ivy Default Cache set to: /root/.ivy2/cache E The jars for the packages stored in: /root/.ivy2/jars E org.apache.hadoop#hadoop-aws added as a dependency E io.delta#delta-spark_2.12 added as a dependency E io.unitycatalog#unitycatalog-spark_2.12 added as a dependency E :: resolving dependencies :: org.apache.spark#spark-submit-parent-1220f5a5-7b09-42dd-bcd8-0a6bd700db79;1.0 E confs: [default] E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. 
E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E :: resolution report :: resolve 4287ms :: artifacts dl 0ms E :: modules in use: E --------------------------------------------------------------------- E | | modules || artifacts | E | conf | number| search|dwnlded|evicted|| number|dwnlded| E --------------------------------------------------------------------- E | default | 3 | 0 | 0 | 0 || 0 | 0 | E --------------------------------------------------------------------- E E :: problems summary :: E :::: WARNINGS E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E module not found: org.apache.hadoop#hadoop-aws;3.3.4 E E ==== local-m2-cache: tried E E file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: E E file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E ==== local-ivy-cache: tried E E /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml E E -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: E E /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar E E ==== central: tried E E https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: E E https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E ==== spark-packages: tried E E https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: E E https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E module not found: io.delta#delta-spark_2.12;3.2.1 E E ==== local-m2-cache: tried E E file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: E E file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E ==== local-ivy-cache: tried E E /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml E E -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: E E /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar E E ==== central: tried E E https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: E E https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E ==== spark-packages: tried E E https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: E E https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 E E ==== local-m2-cache: tried E E file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: E E file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E ==== local-ivy-cache: tried E E /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml E E -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: E E /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar E E ==== central: tried E E https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: E E https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E ==== spark-packages: tried E E https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: E E https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E :::::::::::::::::::::::::::::::::::::::::::::: E E :: UNRESOLVED DEPENDENCIES :: E E :::::::::::::::::::::::::::::::::::::::::::::: E E :: org.apache.hadoop#hadoop-aws;3.3.4: not found E E :: io.delta#delta-spark_2.12;3.2.1: not found E E :: 
io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found E E :::::::::::::::::::::::::::::::::::::::::::::: E E E E :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS E Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] E at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) E at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) E at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) E at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) E at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) E at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) E at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) E at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) E at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) E at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) helpers/cluster.py:153: Exception ----------------------------- Captured stdout call ----------------------------- Default tables ['default.marksheet', 'default.marksheet_uniform', 'default.numbers', 'default.user_countries'] ------------------------------ Captured log call ------------------------------- 2025-07-28 09:45:02.304000 [ 653 ] DEBUG : Executing query create database unity_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false on node1 (cluster.py:3648, query) 2025-07-28 09:45:02.571000 [ 653 ] DEBUG : Executing query SHOW TABLES FROM unity_test LIKE 'default%' on node1 (cluster.py:3648, query) 2025-07-28 09:45:03.190000 [ 653 ] DEBUG : Executing query show create table unity_test.`default.marksheet` on node1 (cluster.py:3648, query) 2025-07-28 09:45:03.558000 [ 653 ] DEBUG : Executing query SELECT * FROM unity_test.`default.marksheet` ORDER BY 1,2,3 on node1 (cluster.py:3648, query) 2025-07-28 09:45:03.925000 [ 653 ] DEBUG : run container_id:roottestdatabasedelta-gw1-node1-1 detach:False nothrow:False cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "SELECT * FROM unity.default.marksheet ORDER BY 1,2,3" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-07-28 09:45:03.926000 [ 653 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw1-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages 
"org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "SELECT * FROM unity.default.marksheet ORDER BY 1,2,3" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-07-28 09:45:10.104000 [ 653 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-07-28 09:45:10.104000 [ 653 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-07-28 09:45:10.104000 [ 653 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:10.104000 [ 653 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:10.105000 [ 653 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:10.105000 [ 653 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-1220f5a5-7b09-42dd-bcd8-0a6bd700db79;1.0 (cluster.py:147, run_and_check) 2025-07-28 09:45:10.105000 [ 653 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-07-28 09:45:10.105000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:10.105000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:10.105000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:10.105000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:10.105000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:10.105000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:10.105000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:10.105000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:10.105000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:10.105000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-07-28 09:45:10.105000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:10.106000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:10.106000 [ 653 ] DEBUG : Stderr::: resolution report :: resolve 4287ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-07-28 09:45:10.106000 [ 653 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.106000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:10.106000 [ 653 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-07-28 09:45:10.106000 [ 653 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-07-28 09:45:10.106000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:10.106000 [ 653 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-07-28 09:45:10.106000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:10.106000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.106000 [ 653 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.106000 [ 653 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-07-28 09:45:10.106000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.106000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.107000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.107000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.107000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.107000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.107000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.107000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.107000 [ 653 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-07-28 09:45:10.107000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.107000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:10.107000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.107000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.107000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.108000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.109000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.109000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.109000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.109000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.109000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.109000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:10.109000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.109000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.109000 [ 
653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.109000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.109000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.109000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.109000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.109000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.110000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.110000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.110000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.110000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.110000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.110000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.110000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.110000 [ 653 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-07-28 09:45:10.110000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.110000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:10.110000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.110000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.110000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.110000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.111000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.111000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.111000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.111000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:10.111000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.111000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:10.111000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.111000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.111000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.111000 [ 
653 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.111000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.111000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:10.111000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.112000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.112000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.112000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.112000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.112000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.112000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.112000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:10.112000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.112000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.112000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.112000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.112000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.112000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.112000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.113000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.113000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.113000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.113000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.113000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.113000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.113000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.113000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.113000 [ 653 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-07-28 09:45:10.113000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.113000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:10.113000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.113000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.114000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.114000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.114000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.114000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.114000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.114000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:10.114000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.114000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:10.114000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.114000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.114000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.114000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.114000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.114000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:10.115000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.115000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.115000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.115000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.115000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.115000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.115000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.115000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:10.115000 [ 653 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.115000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:10.115000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.115000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.115000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.115000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.116000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.117000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:10.117000 [ 653 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-07-28 09:45:10.117000 [ 653 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-07-28 09:45:10.117000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-07-28 09:45:10.117000 [ 653 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-07-28 09:45:10.117000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-07-28 09:45:10.117000 [ 653 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check)
2025-07-28 09:45:10.117000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check)
2025-07-28 09:45:10.117000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check)
2025-07-28 09:45:10.117000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check)
2025-07-28 09:45:10.117000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check)
2025-07-28 09:45:10.117000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check)
2025-07-28 09:45:10.117000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check)
2025-07-28 09:45:10.117000 [ 653 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check)
_________________________ test_multiple_schemes_tables _________________________
[gw1] linux -- Python 3.10.12 /usr/bin/python3

started_cluster =

    def test_multiple_schemes_tables(started_cluster):
        node1 = started_cluster.instances['node1']
        execute_multiple_spark_queries(node1, [f'CREATE SCHEMA test_schema{i}' for i in range(10)], True)
        execute_multiple_spark_queries(node1, [f'CREATE TABLE test_schema{i}.test_table{i} (col1 int, col2 double) using Delta location \'/tmp/test_schema{i}/test_table{i}\'' for i in range(10)], True)
        execute_multiple_spark_queries(node1, [f'INSERT INTO test_schema{i}.test_table{i} VALUES ({i}, {i}.0)' for i in range(10)], True)
        node1.query("create database multi_schema_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false", settings={"allow_experimental_database_unity_catalog": "1"})
        multi_schema_tables = list(sorted(node1.query("SHOW TABLES FROM multi_schema_test LIKE 'test_schema%'", settings={'use_hive_partitioning':'0'}).strip().split('\n')))
        print(multi_schema_tables)
        for i, table in enumerate(multi_schema_tables):
>           assert node1.query(f"SELECT col1 FROM multi_schema_test.`{table}`").strip() == str(i)

test_database_delta/test.py:107:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
helpers/cluster.py:3649: in query
    return self.client.query(
helpers/client.py:39: in wrap
    return func(self, *args, **kwargs)
helpers/client.py:79: in query
    ).get_answer()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    def get_answer(self):
        self.process.wait(timeout=DEFAULT_QUERY_TIMEOUT)
        self.stdout_file.seek(0)
        self.stderr_file.seek(0)
        stdout = self.stdout_file.read().decode("utf-8", errors="replace")
        stderr = self.stderr_file.read().decode("utf-8", errors="replace")
        if (
            self.timer is not None
            and not self.process_finished_before_timeout
            and not self.ignore_error
        ):
            logging.debug(f"Timed out. Last stdout:{stdout}, stderr:{stderr}")
            raise QueryTimeoutExceedException("Client timed out!")
        if (
            self.process.returncode != 0 or self.remove_trash_from_stderr(stderr)
        ) and not self.ignore_error:
>           raise QueryRuntimeException(
                "Client failed!
Return code: {}, stderr: {}".format( self.process.returncode, stderr ), self.process.returncode, stderr, ) E helpers.client.QueryRuntimeException: Client failed! Return code: 62, stderr: Code: 62. DB::Exception: Syntax error: failed at position 36 (``): ``. Expected one of: Colon, Caret, identifier, end of query. (SYNTAX_ERROR), Stack trace (when copying this message, always include the lines below): E E 0. ./contrib/llvm-project/libcxx/include/__exception/exception.h:113: Poco::Exception::Exception(String const&, int) @ 0x00000000382e5051 E 1. ./build_docker/./src/Common/Exception.cpp:108: DB::Exception::Exception(DB::Exception::MessageMasked&&, int, bool) @ 0x000000001bd54ed1 E 2. DB::Exception::createDeprecated(String const&, int, bool) @ 0x000000000c3e5730 E 3. ./build_docker/./src/Parsers/parseQuery.cpp:411: DB::parseQueryAndMovePosition(DB::IParser&, char const*&, char const*, String const&, bool, unsigned long, unsigned long, unsigned long) @ 0x0000000031b59015 E 4. ./build_docker/./src/Client/ClientBase.cpp:403: DB::ClientBase::parseQuery(char const*&, char const*, DB::Settings const&, bool) @ 0x000000002fa0e829 E 5. ./build_docker/./src/Client/ClientBase.cpp:2370: DB::ClientBase::analyzeMultiQueryText(char const*&, char const*&, char const*, String&, std::shared_ptr&, String const&, std::unique_ptr>&) @ 0x000000002fa35967 E 6. ./build_docker/./src/Client/ClientBase.cpp:2508: DB::ClientBase::executeMultiQuery(String const&) @ 0x000000002fa37292 E 7. ./build_docker/./src/Client/ClientBase.cpp:2777: DB::ClientBase::processQueryText(String const&) @ 0x000000002fa3a67e E 8. ./build_docker/./src/Client/ClientBase.cpp:3430: DB::ClientBase::runNonInteractive() @ 0x000000002fa4ca3b E 9. ./build_docker/./programs/client/Client.cpp:407: DB::Client::main(std::vector> const&) @ 0x000000001c234277 E 10. ./build_docker/./base/poco/Util/src/Application.cpp:315: Poco::Util::Application::run() @ 0x0000000038522df7 E 11. ./build_docker/./programs/client/Client.cpp:1141: mainEntryClickHouseClient(int, char**) @ 0x000000001c24b309 E 12. ./build_docker/./programs/main.cpp:295: main @ 0x000000000c37d89f E 13. ? @ 0x00007fed2a9efd90 E 14. ? @ 0x00007fed2a9efe40 E 15. 
_start @ 0x000000000c2a602e helpers/client.py:248: QueryRuntimeException ----------------------------- Captured stdout call ----------------------------- [''] ------------------------------ Captured log call ------------------------------- 2025-07-28 09:45:10.301000 [ 653 ] DEBUG : run container_id:roottestdatabasedelta-gw1-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "CREATE SCHEMA test_schema0;CREATE SCHEMA test_schema1;CREATE SCHEMA test_schema2;CREATE SCHEMA test_schema3;CREATE SCHEMA test_schema4;CREATE SCHEMA test_schema5;CREATE SCHEMA test_schema6;CREATE SCHEMA test_schema7;CREATE SCHEMA test_schema8;CREATE SCHEMA test_schema9" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-07-28 09:45:10.301000 [ 653 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw1-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "CREATE SCHEMA test_schema0;CREATE SCHEMA test_schema1;CREATE SCHEMA test_schema2;CREATE SCHEMA test_schema3;CREATE SCHEMA test_schema4;CREATE SCHEMA test_schema5;CREATE SCHEMA test_schema6;CREATE SCHEMA test_schema7;CREATE SCHEMA test_schema8;CREATE SCHEMA test_schema9" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-07-28 09:45:16.316000 [ 653 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-07-28 09:45:16.316000 [ 653 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-07-28 09:45:16.317000 [ 653 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:16.317000 [ 653 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:16.317000 [ 653 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:16.317000 [ 653 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-92268e60-c479-48be-bb6a-8ae74bf4f921;1.0 (cluster.py:147, run_and_check) 2025-07-28 09:45:16.317000 [ 653 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-07-28 09:45:16.317000 [ 653 ] DEBUG : Stderr:You probably 
access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:16.317000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:16.317000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:16.318000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:16.318000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:16.318000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:16.318000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:16.318000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:16.318000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:16.318000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:16.318000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:16.319000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:16.319000 [ 653 ] DEBUG : Stderr::: resolution report :: resolve 4313ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-07-28 09:45:16.319000 [ 653 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.319000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:16.319000 [ 653 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-07-28 09:45:16.319000 [ 653 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-07-28 09:45:16.320000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:16.320000 [ 653 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-07-28 09:45:16.320000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:16.320000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.320000 [ 653 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.320000 [ 653 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-07-28 09:45:16.320000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. 
url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:16.320000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.321000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.321000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.321000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:16.321000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.321000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.321000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.321000 [ 653 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-07-28 09:45:16.321000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.321000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:16.321000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.321000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:16.322000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.322000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.322000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.322000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.322000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.322000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:16.322000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.322000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:16.322000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.322000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.323000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.323000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.323000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.323000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:16.323000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.323000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:16.323000 [ 653 ] DEBUG : Stderr: (cluster.py:147, 
run_and_check) 2025-07-28 09:45:16.323000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.324000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.324000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.324000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.324000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:16.324000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.324000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:16.324000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.324000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.324000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.325000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.325000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.325000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:16.325000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.325000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.325000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.325000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:16.325000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.325000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.326000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.326000 [ 653 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-07-28 09:45:16.326000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.326000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:16.326000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.326000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:16.326000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.326000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.326000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.327000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.327000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.327000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:16.327000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.327000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:16.327000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.327000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.327000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.327000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.328000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.328000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:16.328000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.328000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:16.328000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.328000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.328000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.328000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.328000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.328000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:16.328000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.329000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 
2025-07-28 09:45:16.329000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.329000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.329000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.329000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.329000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.329000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:16.329000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.329000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.330000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.330000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:16.330000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.330000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.330000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.330000 [ 653 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-07-28 09:45:16.330000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.330000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:16.330000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.330000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:16.331000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.331000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.331000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.331000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.331000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.331000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:16.331000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.331000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:16.332000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.332000 [ 653 ] DEBUG : Stderr: 
-- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.332000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.332000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.332000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.332000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:16.332000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.332000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:16.333000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.333000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.333000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.333000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.333000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.333000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:16.333000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.333000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:16.334000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.334000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.334000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.334000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:16.334000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.334000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.334000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.334000 [ 653 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.334000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.334000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.334000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.335000 [ 653 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-07-28 09:45:16.335000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.335000 [ 653 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-07-28 09:45:16.335000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.335000 [ 653 ] DEBUG : 
Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-07-28 09:45:16.335000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.335000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.335000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.335000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.335000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:16.335000 [ 653 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-07-28 09:45:16.336000 [ 653 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-07-28 09:45:16.336000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-07-28 09:45:16.336000 [ 653 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-07-28 09:45:16.336000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-07-28 09:45:16.336000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-07-28 09:45:16.336000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-07-28 09:45:16.336000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-07-28 09:45:16.336000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-07-28 09:45:16.336000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-07-28 09:45:16.336000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-07-28 09:45:16.336000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-07-28 09:45:16.336000 [ 653 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-07-28 09:45:16.337000 [ 653 ] DEBUG : run container_id:roottestdatabasedelta-gw1-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf 
"spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "CREATE TABLE test_schema0.test_table0 (col1 int, col2 double) using Delta location \'/tmp/test_schema0/test_table0\';CREATE TABLE test_schema1.test_table1 (col1 int, col2 double) using Delta location \'/tmp/test_schema1/test_table1\';CREATE TABLE test_schema2.test_table2 (col1 int, col2 double) using Delta location \'/tmp/test_schema2/test_table2\';CREATE TABLE test_schema3.test_table3 (col1 int, col2 double) using Delta location \'/tmp/test_schema3/test_table3\';CREATE TABLE test_schema4.test_table4 (col1 int, col2 double) using Delta location \'/tmp/test_schema4/test_table4\';CREATE TABLE test_schema5.test_table5 (col1 int, col2 double) using Delta location \'/tmp/test_schema5/test_table5\';CREATE TABLE test_schema6.test_table6 (col1 int, col2 double) using Delta location \'/tmp/test_schema6/test_table6\';CREATE TABLE test_schema7.test_table7 (col1 int, col2 double) using Delta location \'/tmp/test_schema7/test_table7\';CREATE TABLE test_schema8.test_table8 (col1 int, col2 double) using Delta location \'/tmp/test_schema8/test_table8\';CREATE TABLE test_schema9.test_table9 (col1 int, col2 double) using Delta location \'/tmp/test_schema9/test_table9\'" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-07-28 09:45:16.337000 [ 653 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw1-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "CREATE TABLE test_schema0.test_table0 (col1 int, col2 double) using Delta location '/tmp/test_schema0/test_table0';CREATE TABLE test_schema1.test_table1 (col1 int, col2 double) using Delta location '/tmp/test_schema1/test_table1';CREATE TABLE test_schema2.test_table2 (col1 int, col2 double) using Delta location '/tmp/test_schema2/test_table2';CREATE TABLE test_schema3.test_table3 (col1 int, col2 double) using Delta location '/tmp/test_schema3/test_table3';CREATE TABLE test_schema4.test_table4 (col1 int, col2 double) using Delta location '/tmp/test_schema4/test_table4';CREATE TABLE test_schema5.test_table5 (col1 int, col2 double) using Delta location '/tmp/test_schema5/test_table5';CREATE TABLE test_schema6.test_table6 (col1 int, col2 double) using Delta location '/tmp/test_schema6/test_table6';CREATE TABLE test_schema7.test_table7 (col1 int, col2 double) using Delta location '/tmp/test_schema7/test_table7';CREATE TABLE test_schema8.test_table8 (col1 int, col2 double) using Delta location '/tmp/test_schema8/test_table8';CREATE TABLE test_schema9.test_table9 (col1 int, col2 double) using Delta location '/tmp/test_schema9/test_table9'" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-07-28 09:45:22.457000 [ 653 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-07-28 09:45:22.458000 [ 653 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars 
(cluster.py:147, run_and_check) 2025-07-28 09:45:22.458000 [ 653 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:22.458000 [ 653 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:22.458000 [ 653 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:22.458000 [ 653 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-3e786770-6d44-47c5-96b0-b1cfabc485ac;1.0 (cluster.py:147, run_and_check) 2025-07-28 09:45:22.458000 [ 653 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-07-28 09:45:22.458000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:22.458000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:22.459000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:22.459000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:22.459000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:22.459000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:22.459000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:22.459000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:22.459000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:22.459000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:22.459000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:22.460000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-07-28 09:45:22.460000 [ 653 ] DEBUG : Stderr::: resolution report :: resolve 4272ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-07-28 09:45:22.460000 [ 653 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.460000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:22.460000 [ 653 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-07-28 09:45:22.460000 [ 653 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-07-28 09:45:22.460000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:22.460000 [ 653 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-07-28 09:45:22.460000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:22.461000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.461000 [ 653 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.461000 [ 653 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-07-28 09:45:22.461000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.461000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.461000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.461000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.461000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.461000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.462000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.462000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.462000 [ 653 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-07-28 09:45:22.462000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.462000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:22.462000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.462000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.462000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.463000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.463000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.463000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.463000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.463000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:22.463000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.463000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:22.463000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.463000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.464000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.464000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.464000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.464000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:22.464000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.464000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.464000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.464000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.464000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.464000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.465000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.465000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:22.465000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.465000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.465000 [ 
653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.465000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.465000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.465000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.465000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.465000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.466000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.466000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.466000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.466000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.466000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.466000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.466000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.466000 [ 653 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-07-28 09:45:22.466000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.467000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:22.467000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.467000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.467000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.467000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.467000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.467000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.467000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.467000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:22.468000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.468000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:22.468000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.468000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.468000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.468000 [ 
653 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.468000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.468000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:22.468000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.468000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.469000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.469000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.469000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.469000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.469000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.469000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:22.469000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.469000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.469000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.470000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.470000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.470000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.470000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.470000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.470000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.470000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.470000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.470000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.471000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.471000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.471000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.471000 [ 653 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-07-28 09:45:22.471000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.471000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:22.471000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.471000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.471000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.471000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.472000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.472000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.472000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.472000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:22.472000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.472000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:22.472000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.472000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.472000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.473000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.473000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.473000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:22.473000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.473000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.474000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.474000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.474000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.474000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.474000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.474000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:22.474000 [ 653 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.474000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:22.474000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.475000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.475000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.475000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:22.475000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.475000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.475000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.475000 [ 653 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.475000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.475000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.476000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.476000 [ 653 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-07-28 09:45:22.476000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.476000 [ 653 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-07-28 09:45:22.476000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.476000 [ 653 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-07-28 09:45:22.476000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.476000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.476000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.476000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.477000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:22.477000 [ 653 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-07-28 09:45:22.477000 [ 653 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-07-28 09:45:22.477000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-07-28 09:45:22.477000 [ 653 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-07-28 09:45:22.477000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-07-28 09:45:22.477000 [ 653 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-07-28 09:45:22.477000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-07-28 09:45:22.477000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-07-28 09:45:22.478000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-07-28 09:45:22.478000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-07-28 09:45:22.478000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-07-28 09:45:22.478000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-07-28 09:45:22.478000 [ 653 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-07-28 09:45:22.478000 [ 653 ] DEBUG : run container_id:roottestdatabasedelta-gw1-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "INSERT INTO test_schema0.test_table0 VALUES (0, 0.0);INSERT INTO test_schema1.test_table1 VALUES (1, 1.0);INSERT INTO test_schema2.test_table2 VALUES (2, 2.0);INSERT INTO test_schema3.test_table3 VALUES (3, 3.0);INSERT INTO test_schema4.test_table4 VALUES (4, 4.0);INSERT INTO test_schema5.test_table5 VALUES (5, 5.0);INSERT INTO test_schema6.test_table6 VALUES (6, 6.0);INSERT INTO test_schema7.test_table7 VALUES (7, 7.0);INSERT INTO test_schema8.test_table8 VALUES (8, 8.0);INSERT INTO test_schema9.test_table9 VALUES (9, 9.0)" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-07-28 09:45:22.478000 [ 653 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw1-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "INSERT INTO test_schema0.test_table0 VALUES (0, 0.0);INSERT INTO test_schema1.test_table1 VALUES (1, 1.0);INSERT INTO test_schema2.test_table2 VALUES (2, 2.0);INSERT 
INTO test_schema3.test_table3 VALUES (3, 3.0);INSERT INTO test_schema4.test_table4 VALUES (4, 4.0);INSERT INTO test_schema5.test_table5 VALUES (5, 5.0);INSERT INTO test_schema6.test_table6 VALUES (6, 6.0);INSERT INTO test_schema7.test_table7 VALUES (7, 7.0);INSERT INTO test_schema8.test_table8 VALUES (8, 8.0);INSERT INTO test_schema9.test_table9 VALUES (9, 9.0)" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-07-28 09:45:28.584000 [ 653 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-07-28 09:45:28.584000 [ 653 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-07-28 09:45:28.584000 [ 653 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:28.584000 [ 653 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:28.584000 [ 653 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-07-28 09:45:28.584000 [ 653 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-494cc2b4-fddc-4ea2-8d6f-b992cc92150f;1.0 (cluster.py:147, run_and_check) 2025-07-28 09:45:28.585000 [ 653 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-07-28 09:45:28.585000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:28.585000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:28.585000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:28.585000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:28.585000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:28.585000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:28.585000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:28.585000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:28.586000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:28.586000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:28.586000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-07-28 09:45:28.586000 [ 653 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-07-28 09:45:28.586000 [ 653 ] DEBUG : Stderr::: resolution report :: resolve 4282ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-07-28 09:45:28.586000 [ 653 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.586000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:28.586000 [ 653 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-07-28 09:45:28.586000 [ 653 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-07-28 09:45:28.587000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:28.587000 [ 653 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-07-28 09:45:28.587000 [ 653 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-07-28 09:45:28.587000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.587000 [ 653 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.587000 [ 653 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-07-28 09:45:28.587000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:28.587000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.587000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.588000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.588000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:28.588000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.588000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.588000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.588000 [ 653 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-07-28 09:45:28.588000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.588000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:28.588000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.589000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:28.589000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.589000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.589000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.589000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.589000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.589000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:28.589000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.590000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:28.590000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.590000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.590000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.590000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.590000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.590000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:28.590000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.590000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:28.591000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.591000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.591000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.591000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.591000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.591000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:28.591000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.591000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:28.591000 [ 
653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.592000 [ 653 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.592000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.592000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.592000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.592000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:28.592000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.592000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.592000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.593000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:28.593000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.593000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.593000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.593000 [ 653 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-07-28 09:45:28.593000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.593000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:28.593000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.593000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:28.593000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.594000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.594000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.594000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.594000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.594000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:28.594000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.594000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:28.594000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.595000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.595000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.595000 [ 
653 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.595000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.595000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:28.595000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.595000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:28.595000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.596000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.596000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.596000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.596000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.596000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:28.596000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.596000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:28.596000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.596000 [ 653 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.597000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.597000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.597000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.597000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:28.597000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.597000 [ 653 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.597000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.597000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:28.598000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.598000 [ 653 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.598000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.598000 [ 653 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-07-28 09:45:28.598000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.598000 [ 653 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:28.598000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.598000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:28.598000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.599000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.599000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.599000 [ 653 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.599000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.599000 [ 653 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:28.599000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.599000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-07-28 09:45:28.599000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.599000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.600000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.600000 [ 653 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.600000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.600000 [ 653 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:28.600000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.600000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-07-28 09:45:28.600000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.600000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.600000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.601000 [ 653 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-07-28 09:45:28.601000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-07-28 09:45:28.601000 [ 653 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-07-28 09:45:28.601000 [ 653 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.601000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check)
2025-07-28 09:45:28.601000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.601000 [ 653 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.601000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.601000 [ 653 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check)
2025-07-28 09:45:28.602000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.602000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.602000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.602000 [ 653 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.602000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.602000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.602000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.602000 [ 653 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check)
2025-07-28 09:45:28.602000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.603000 [ 653 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check)
2025-07-28 09:45:28.603000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.603000 [ 653 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check)
2025-07-28 09:45:28.603000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.603000 [ 653 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.603000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.603000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.603000 [ 653 ] DEBUG : Stderr: (cluster.py:147, run_and_check)
2025-07-28 09:45:28.603000 [ 653 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check)
2025-07-28 09:45:28.604000 [ 653 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check)
2025-07-28 09:45:28.604000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check)
2025-07-28 09:45:28.604000 [ 653 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check)
2025-07-28 09:45:28.604000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check)
2025-07-28 09:45:28.604000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check)
2025-07-28 09:45:28.604000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check)
2025-07-28 09:45:28.604000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check)
2025-07-28 09:45:28.604000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check)
2025-07-28 09:45:28.605000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check)
2025-07-28 09:45:28.605000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check)
2025-07-28 09:45:28.605000 [ 653 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check)
2025-07-28 09:45:28.605000 [ 653 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check)
2025-07-28 09:45:28.605000 [ 653 ] DEBUG : Executing query create database multi_schema_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false on node1 (cluster.py:3648, query)
2025-07-28 09:45:28.922000 [ 653 ] DEBUG : Executing query SHOW TABLES FROM multi_schema_test LIKE 'test_schema%' on node1 (cluster.py:3648, query)
2025-07-28 09:45:29.540000 [ 653 ] DEBUG : Executing query SELECT col1 FROM multi_schema_test.`` on node1 (cluster.py:3648, query)
---------------------------- Captured log teardown -----------------------------
2025-07-28 09:45:30.206000 [ 653 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/.env --project-name roottestdatabasedelta-gw1 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/node1/docker-compose.yml stop --timeout 20] (cluster.py:121, run_and_check)
2025-07-28 09:45:35.501000 [ 653 ] DEBUG : Stderr: Container roottestdatabasedelta-gw1-node1-1 Stopping (cluster.py:147, run_and_check)
2025-07-28 09:45:35.502000 [ 653 ] DEBUG : Stderr: Container roottestdatabasedelta-gw1-node1-1 Stopped (cluster.py:147, run_and_check)
2025-07-28 09:45:35.502000 [ 653 ] DEBUG : Command:[bash -c [ -f /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/node1/logs/stderr.log ] && zgrep -aH "==================" /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/node1/logs/stderr.log* | ( [ -z "" ] && cat || grep -v "$" ) || true] (cluster.py:121, run_and_check)
2025-07-28 09:45:35.518000 [ 653 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/.env --project-name roottestdatabasedelta-gw1 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw1/node1/docker-compose.yml down --volumes] (cluster.py:121, run_and_check)
2025-07-28 09:45:36.041000 [ 653 ] DEBUG : Stderr: Container roottestdatabasedelta-gw1-node1-1 Stopping (cluster.py:147, run_and_check)
2025-07-28 09:45:36.041000 [ 653 ] DEBUG : Stderr: Container roottestdatabasedelta-gw1-node1-1 Stopped (cluster.py:147, run_and_check)
2025-07-28 09:45:36.041000 [ 653 ] DEBUG : Stderr: Container roottestdatabasedelta-gw1-node1-1 Removing (cluster.py:147, run_and_check)
2025-07-28 09:45:36.041000 [ 653 ] DEBUG : Stderr: Container roottestdatabasedelta-gw1-node1-1 Removed (cluster.py:147, run_and_check)
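Note on the failure above: every coordinate that spark-submit tried to resolve came back "not found", including org.apache.hadoop#hadoop-aws;3.3.4, which is a long-published artifact. That pattern points at repository access from inside the runner container rather than at the test logic itself. A minimal, hypothetical probe (Python; not part of the test suite - the coordinates and the spark-packages URL are copied from the log, and repo1.maven.org is an assumption) could confirm whether the artifacts are reachable from that environment:

    # Hypothetical connectivity probe for the coordinates that failed to resolve above.
    import urllib.request

    COORDINATES = [
        "org.apache.hadoop#hadoop-aws;3.3.4",
        "io.delta#delta-spark_2.12;3.2.1",
        "io.unitycatalog#unitycatalog-spark_2.12;0.2.0",
    ]
    REPOSITORIES = [
        "https://repo1.maven.org/maven2",    # Maven Central (assumed default resolver)
        "https://repos.spark-packages.org",  # repository seen in the log above
    ]

    def pom_url(repo: str, coordinate: str) -> str:
        """Translate group#artifact;version into the POM path Ivy would request."""
        group, rest = coordinate.split("#")
        artifact, version = rest.split(";")
        return f"{repo}/{group.replace('.', '/')}/{artifact}/{version}/{artifact}-{version}.pom"

    for coordinate in COORDINATES:
        for repo in REPOSITORIES:
            url = pom_url(repo, coordinate)
            try:
                with urllib.request.urlopen(url, timeout=10) as response:
                    print(f"{coordinate}: HTTP {response.status} from {repo}")
                    break
            except Exception as exc:  # any failure just means "not reachable from here"
                print(f"{coordinate}: {exc!r} from {repo}")

If Central answers but the resolver inside the container cannot, the usual workarounds would be pre-baking the jars into the Spark image or pointing spark.jars.repositories at a reachable mirror; both are outside the scope of this test run.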
2025-07-28 09:45:36.042000 [ 653 ] DEBUG : Stderr: Network roottestdatabasedelta-gw1_default Removing (cluster.py:147, run_and_check)
2025-07-28 09:45:36.042000 [ 653 ] DEBUG : Stderr: Network roottestdatabasedelta-gw1_default Removed (cluster.py:147, run_and_check)
2025-07-28 09:45:36.042000 [ 653 ] DEBUG : Cleanup called (cluster.py:851, cleanup)
2025-07-28 09:45:36.074000 [ 653 ] DEBUG : Docker networks for project roottestdatabasedelta-gw1 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces)
2025-07-28 09:45:36.102000 [ 653 ] DEBUG : Docker containers for project roottestdatabasedelta-gw1 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces)
2025-07-28 09:45:36.139000 [ 653 ] DEBUG : Docker volumes for project roottestdatabasedelta-gw1 are DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces)
2025-07-28 09:45:36.139000 [ 653 ] DEBUG : Command:[docker container list --all --filter name='^/roottestdatabasedelta-gw1-.*-1$' --format '{{.ID}}:{{.Names}}'] (cluster.py:121, run_and_check)
2025-07-28 09:45:36.170000 [ 653 ] DEBUG : Unstopped containers: {} (cluster.py:865, cleanup)
2025-07-28 09:45:36.171000 [ 653 ] DEBUG : No running containers for project: roottestdatabasedelta-gw1 (cluster.py:879, cleanup)
2025-07-28 09:45:36.171000 [ 653 ] DEBUG : Trying to prune unused networks... (cluster.py:885, cleanup)
2025-07-28 09:45:36.199000 [ 653 ] DEBUG : Trying to prune unused images... (cluster.py:901, cleanup)
2025-07-28 09:45:36.200000 [ 653 ] DEBUG : Command:[docker image prune -f] (cluster.py:121, run_and_check)
2025-07-28 09:45:36.243000 [ 653 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:145, run_and_check)
2025-07-28 09:45:36.244000 [ 653 ] DEBUG : Images pruned (cluster.py:904, cleanup)
2025-07-28 09:45:36.244000 [ 653 ] DEBUG : Trying to prune unused volumes... (cluster.py:910, cleanup)
2025-07-28 09:45:36.244000 [ 653 ] DEBUG : Command:[docker volume ls | wc -l] (cluster.py:121, run_and_check)
2025-07-28 09:45:36.274000 [ 653 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check)
2025-07-28 09:45:36.274000 [ 653 ] DEBUG : Volumes pruned: 1 (cluster.py:915, cleanup)
----------------- generated report log file: parallel0_1.jsonl -----------------
============================== slowest durations ===============================
20.33s call test_database_delta/test.py::test_complex_table_schema
19.81s call test_database_delta/test.py::test_multiple_schemes_tables
15.40s setup test_database_delta/test.py::test_complex_table_schema
7.81s call test_database_delta/test.py::test_embedded_database_and_tables
6.07s teardown test_database_delta/test.py::test_multiple_schemes_tables
0.00s teardown test_database_delta/test.py::test_complex_table_schema
0.00s teardown test_database_delta/test.py::test_embedded_database_and_tables
0.00s setup test_database_delta/test.py::test_embedded_database_and_tables
0.00s setup test_database_delta/test.py::test_multiple_schemes_tables
=========================== short test summary info ============================
FAILED test_database_delta/test.py::test_complex_table_schema - helpers.clien...
FAILED test_database_delta/test.py::test_embedded_database_and_tables - Excep...
FAILED test_database_delta/test.py::test_multiple_schemes_tables - helpers.cl...
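A secondary symptom from the captured log: the last query before teardown is SELECT col1 FROM multi_schema_test.`` with an empty table identifier, which suggests SHOW TABLES returned nothing (the Spark-side CREATE statements never ran because of the dependency failure) and the test then indexed into an empty split result. A small defensive sketch, assuming, hypothetically, that the table list is built by splitting the query output (this helper is illustrative, not the suite's actual code):

    # Hypothetical guard around SHOW TABLES output.
    def tables_from_show_output(output: str) -> list[str]:
        """Split SHOW TABLES output into table names, dropping blank lines."""
        return [line for line in output.strip().split("\n") if line]

    # Two sample outputs: a healthy one, and the empty string seen when the
    # Spark-side setup never created any tables.
    for raw in ("test_schema.table1\ntest_schema.table2", ""):
        tables = tables_from_show_output(raw)
        if tables:
            print("first table:", tables[0])
        else:
            print("no tables returned; skip the follow-up SELECT instead of querying ``")

Failing fast at that point would surface "no tables were created" instead of the less obvious empty-identifier query.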
========================= 3 failed in 72.42s (0:01:12) =========================
Traceback (most recent call last):
  File "/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration/./runner", line 492, in
    subprocess.check_call(cmd, shell=True, bufsize=0)
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'docker run --rm --name clickhouse_integration_tests_42qml6 --privileged --dns-search='.' --memory=30709026816 --security-opt seccomp=unconfined --cap-add=SYS_PTRACE --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/backupview:/ClickHouse/utils/backupview --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/grpc-client/pb2:/ClickHouse/utils/grpc-client/pb2 --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_DOTNET_CLIENT_TAG=11de0b29a15d -e DOCKER_HELPER_TAG=5dc43a6382f0 -e DOCKER_BASE_TAG=5ccda723c1fc -e DOCKER_KERBEROS_KDC_TAG=9391ecdee8d7 -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=9bec2a638e6e -e DOCKER_MYSQL_JAVA_CLIENT_TAG=766bff31cfe4 -e DOCKER_MYSQL_JS_CLIENT_TAG=41ba7c2ec2a1 -e DOCKER_MYSQL_PHP_CLIENT_TAG=88be89c1e3b6 -e DOCKER_NGINX_DAV_TAG=b55ac9cd7519 -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=a4eff5c7f4d6 -e DOCKER_PYTHON_BOTTLE_TAG=d862517635bf -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e CLICKHOUSE_USE_OLD_ANALYZER=1 -e PYTHONUNBUFFERED=1 -e PYTEST_ADDOPTS="--dist=loadfile -n 10 -rfEps --run-id=1 --color=no --durations=0 --report-log=parallel0_1.jsonl --report-log-exclude-logs-on-passed-tests test_database_delta/test.py::test_complex_table_schema test_database_delta/test.py::test_embedded_database_and_tables test_database_delta/test.py::test_multiple_schemes_tables -vvv " altinityinfra/integration-tests-runner:226bfaf75ac1 ' returned non-zero exit status 1.
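The trailing traceback is only the outer runner propagating the container's exit status: subprocess.check_call(cmd, shell=True, bufsize=0) raises CalledProcessError whenever the docker run command exits non-zero, carrying nothing beyond the command string and the return code. A minimal sketch of that propagation, with 'false' standing in for the real docker invocation (a hypothetical placeholder):

    import subprocess

    # Stand-in for the long 'docker run ... integration-tests-runner ...' command;
    # all that matters for the traceback shape is the non-zero exit status.
    cmd = "false"

    try:
        subprocess.check_call(cmd, shell=True, bufsize=0)
    except subprocess.CalledProcessError as err:
        # Mirrors the last line of the log: only the command and the exit code survive.
        print(f"Command {err.cmd!r} returned non-zero exit status {err.returncode}.")

The actionable failure is therefore the unresolved-dependency error captured earlier in the log, not this wrapper exception.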